Predictive stability criteria for penalty selection in linear models
Abstract
Choosing a shrinkage method can be done by selecting a penalty from a list of pre-specified penalties or by constructing a penalty based on the data. If a list of penalties for a class of linear models is given, we introduce a predictive stability criterion based on data perturbation to select a penalty from the list. Simulation studies show that the methods our criterion identifies usually agree with the existing literature and help explain heuristically when a given penalty can be expected to perform well. If the preference is to construct a penalty customized to a problem, we then propose a technique based on genetic algorithms, again using our predictive criterion. We find that, in general, a custom penalty never performs worse than any of the commonly used penalties, and there are cases in which the custom penalty reduces to a recognizable penalty. Since penalty selection is mathematically equivalent to prior selection, our method also constructs priors. Our methodology allows us to observe that the oracle property typically holds for penalties satisfying basic regularity conditions and is therefore not restrictive enough to play a direct role in penalty selection. In addition, our methodology, since it can be applied immediately to real problems, permits us to take model mis-specification into account.
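The select-from-a-list step can be illustrated with a minimal sketch. This is not the paper's actual criterion: here we assume a ridge family indexed by a grid of penalty weights, use bootstrap resampling as the data perturbation, and score each penalty by a hypothetical `stability_score` that adds out-of-bag prediction error to the variance of predictions across perturbed refits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: sparse linear truth plus noise.
n, p = 80, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate for the penalty lam * ||beta||^2."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def stability_score(X, y, lam, B=50):
    """Out-of-bag error plus prediction instability over B bootstrap perturbations."""
    preds, errs = [], []
    for _ in range(B):
        idx = rng.integers(0, n, n)                    # perturb the data by resampling
        b = ridge_fit(X[idx], y[idx], lam)
        preds.append(X @ b)                            # predictions on the full design
        oob = np.setdiff1d(np.arange(n), idx)          # held-out (out-of-bag) rows
        errs.append(np.mean((y[oob] - X[oob] @ b) ** 2))
    instability = np.mean(np.var(np.stack(preds), axis=0))
    return np.mean(errs) + instability

# Score each candidate penalty weight and keep the most stable one.
lams = [0.01, 0.1, 1.0, 10.0]
scores = {lam: stability_score(X, y, lam) for lam in lams}
best = min(scores, key=scores.get)
```

A pure instability criterion would always favor the heaviest shrinkage, so the sketch adds the out-of-bag error term to keep the trade-off visible; extending `lams` to index genuinely different penalty functions (lasso, SCAD, a genetic-algorithm construction) follows the same pattern.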
Similar resources
Adaptive Bayesian Criteria in Variable Selection for Generalized Linear Models
For the problem of variable selection in generalized linear models, we develop various adaptive Bayesian criteria. Using a hierarchical mixture setup for model uncertainty, combined with an integrated Laplace approximation, we derive Empirical Bayes and Fully Bayes criteria that can be computed easily and quickly. The performance of these criteria is assessed via simulation and compared to othe...
Minimum Description Length Model Selection Criteria for Generalized Linear Models
This paper derives several model selection criteria for generalized linear models (GLMs) following the principle of Minimum Description Length (MDL). We focus our attention on the mixture form of MDL. Normal or normal-inverse gamma distributions are used to construct the mixtures, depending on whether or not we choose to account for possible over-dispersion in the data. For the latter, we use E...
Variable selection in linear regression through adaptive penalty selection
Model selection procedures often use a fixed penalty, such as Mallows’ Cp, to avoid choosing a model which fits a particular data set extremely well. These procedures are often devised to give an unbiased risk estimate when a particular chosen model is used to predict future responses. As a correction for not including the variability induced in model selection, generalized degrees of freedom i...
Understanding predictive information criteria for Bayesian models
We review the Akaike, deviance, and Watanabe-Akaike information criteria from a Bayesian perspective, where the goal is to estimate expected out-of-sample-prediction error using a biascorrected adjustment of within-sample error. We focus on the choices involved in setting up these measures, and we compare them in three simple examples, one theoretical and two applied. The contribution of this p...
Variable selection and estimation in generalized linear models with the seamless L0 penalty.
In this paper, we propose variable selection and estimation in generalized linear models using the seamless L0 (SELO) penalized likelihood approach. The SELO penalty is a smooth function that very closely resembles the discontinuous L0 penalty. We develop an e cient algorithm to fit the model, and show that the SELO-GLM procedure has the oracle property in the presence of a diverging number of ...
Journal
Journal title: Computational Statistics
Year: 2023
ISSN: 0943-4062 (print), 1613-9658 (electronic)
DOI: https://doi.org/10.1007/s00180-023-01342-8